Supplementary Material: Learning an Invariant Hilbert Space for Domain Adaptation

Authors

  • Samitha Herath
  • Mehrtash Harandi
  • Fatih Porikli
Abstract

$$L = L_d + \lambda L_u. \qquad (1)$$

In Eq. (1), $L_d$ is a measure of dissimilarity between labeled samples. The term $L_u$ quantifies a notion of statistical difference between the source and target samples in the latent space. In brief, the cost $L_d$ was based on the proposed generalized soft-margin loss $\ell_\beta$ on labeled pairs. The statistical loss $L_u$ was based on the Stein divergence $\delta_s$ between the source and target domain covariances in the latent space. Here, we intend to derive the derivatives of the proposed $\ell_\beta$ and $L_u$. Note that all variable dimensions and notation are the same as in the main text.
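To make the structure of Eq. (1) concrete, the following is a minimal NumPy sketch that assembles the two terms: a generic smooth soft-margin surrogate standing in for $\ell_\beta$ on labeled pairs, and the Stein (Jensen-Bregman LogDet) divergence between the source and target latent covariances for $L_u$. The function names, the exact form of the surrogate, and the pair-margin convention are illustrative assumptions, not the authors' implementation.

    import numpy as np

    # Illustrative sketch of L = L_d + lambda * L_u (Eq. 1). The surrogate below,
    # (1/beta) * log(1 + exp(beta * m)), is a generic smooth soft-margin loss;
    # the paper's generalized soft-margin loss l_beta may differ in exact form.

    def soft_margin(margin, beta=1.0):
        # Numerically stable (1/beta) * log(1 + exp(beta * margin)).
        z = beta * np.asarray(margin, dtype=float)
        return np.where(z > 0, z + np.log1p(np.exp(-z)), np.log1p(np.exp(z))) / beta

    def stein_divergence(A, B, eps=1e-6):
        # Stein (Jensen-Bregman LogDet) divergence between SPD matrices:
        # log det((A + B) / 2) - 0.5 * (log det A + log det B).
        d = A.shape[0]
        A = A + eps * np.eye(d)   # small ridge for numerical stability
        B = B + eps * np.eye(d)
        _, logdet_mid = np.linalg.slogdet(0.5 * (A + B))
        _, logdet_a = np.linalg.slogdet(A)
        _, logdet_b = np.linalg.slogdet(B)
        return logdet_mid - 0.5 * (logdet_a + logdet_b)

    def total_loss(Z_src, Z_tgt, pair_margins, lam=1.0, beta=1.0):
        # Z_src, Z_tgt: (n_s, p) and (n_t, p) latent features of the two domains.
        # pair_margins: signed constraint violations of the labeled pairs.
        L_d = soft_margin(pair_margins, beta).mean()
        cov_src = np.cov(Z_src, rowvar=False)
        cov_tgt = np.cov(Z_tgt, rowvar=False)
        L_u = stein_divergence(cov_src, cov_tgt)
        return L_d + lam * L_u

In practice the quantity of interest is not the value of this objective but its gradient with respect to the mapping that produces the latent features, which is what the derivation below provides.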


Similar resources

Image alignment via kernelized feature learning

Machine learning is an application of artificial intelligence that automatically learns and improves from experience without being explicitly programmed. The primary assumption of most machine learning algorithms is that the training set (source domain) and the test set (target domain) are drawn from the same probability distribution. However, in most of the real-world application...


Robust Unsupervised Domain Adaptation for Neural Networks via Moment Alignment

A novel approach for unsupervised domain adaptation for neural networks is proposed that relies on a metric-based regularization of the learning process. The metric-based regularization aims at domain-invariant latent feature representations by means of maximizing the similarity between domain-specific activation distributions. The proposed metric results from modifying an integral probability me...
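As a rough illustration of matching domain-specific activation distributions, the sketch below penalizes differences between the means and the first few central moments of source and target activations, computed per hidden unit. This is only a CMD-style stand-in with assumed names; the metric actually proposed in the paper is a modified integral probability metric and may differ.

    import numpy as np

    def moment_alignment_penalty(A_src, A_tgt, n_moments=5):
        # A_src, A_tgt: (n, d) hidden activations from the source and target domains.
        # Penalize the gap between the means and the first n_moments central
        # moments of the two activation distributions (illustrative only).
        mu_s, mu_t = A_src.mean(axis=0), A_tgt.mean(axis=0)
        penalty = np.linalg.norm(mu_s - mu_t)
        c_s, c_t = A_src - mu_s, A_tgt - mu_t
        for k in range(2, n_moments + 1):
            penalty += np.linalg.norm((c_s ** k).mean(axis=0) - (c_t ** k).mean(axis=0))
        return penalty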


Sample-oriented Domain Adaptation for Image Classification

Image processing is a method to perform operations on an image in order to obtain an enhanced image or to extract useful information from it. Conventional image processing algorithms cannot perform well in scenarios where the training images (source domain) used to learn the model have a different distribution from the test images (target domain). Also, many real-world applicat...


Approximation of fixed points for a continuous representation of nonexpansive mappings in Hilbert spaces

This paper introduces an implicit scheme for a continuous representation of nonexpansive mappings on a closed convex subset of a Hilbert space with respect to a sequence of invariant means defined on an appropriate space of bounded, continuous real-valued functions of the semigroup. The main result is to prove the strong convergence of the proposed implicit scheme to the unique solutio...


Shift Invariant Spaces and Shift Preserving Operators on Locally Compact Abelian Groups

We investigate shift invariant subspaces of $L^2(G)$, where $G$ is a locally compact abelian group. We show that every shift invariant space can be decomposed as an orthogonal sum of spaces each of which is generated by a single function whose shifts form a Parseval frame. For a second countable locally compact abelian group $G$ we prove a useful Hilbert space isomorphism, introduce range funct...
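Schematically, the decomposition described above can be written as follows, with the notation assumed here for illustration ($V$ a shift-invariant subspace of $L^2(G)$, $T_h$ translation by an element $h$ of the acting subgroup $H$, and $\varphi_i$ the single generators):

$$V = \bigoplus_{i \in I} S(\varphi_i), \qquad S(\varphi_i) = \overline{\operatorname{span}}\{\, T_h \varphi_i : h \in H \,\},$$

where, for each $i$, the system $\{ T_h \varphi_i \}_{h \in H}$ is a Parseval frame for $S(\varphi_i)$.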



Journal title:

Volume   Issue

Pages  -

Publication date: 2017